Alternating minimization methods for strongly convex optimization

Authors

Abstract

We consider alternating minimization procedures for convex and non-convex optimization problems with the vector of variables divided into several blocks, each block being amenable to minimization with respect to its variables while maintaining the other blocks constant. In the case of two blocks, we prove a linear convergence rate for an alternating minimization procedure under the Polyak–Łojasiewicz (PL) condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-blocks setting, we provide an accelerated alternating minimization procedure with a convergence rate depending on the square root of the condition number, as opposed to just the condition number for the non-accelerated method. We also consider the problem of finding an approximate non-negative solution of a system of linear equations Ax = y by alternating minimization of the Kullback–Leibler (KL) divergence between Ax and y.
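
To make the block structure concrete, the following is a minimal sketch of two-block alternating minimization on a toy least-squares objective, where each block subproblem has a closed-form solution. It is an illustration only, not the authors' exact procedure, and the problem data (A1, A2, b) are hypothetical.

```python
# Two-block alternating minimization sketch (illustration, not the paper's method).
# Objective: f(x1, x2) = 0.5 * ||A1 x1 + A2 x2 - b||^2; each block update is an
# ordinary least-squares problem with the other block held fixed.
import numpy as np

rng = np.random.default_rng(0)
A1 = rng.standard_normal((30, 5))
A2 = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

def objective(x1, x2):
    r = A1 @ x1 + A2 @ x2 - b
    return 0.5 * r @ r

x1 = np.zeros(5)
x2 = np.zeros(5)
for k in range(100):
    # Minimize over block x1 with x2 held constant.
    x1, *_ = np.linalg.lstsq(A1, b - A2 @ x2, rcond=None)
    # Minimize over block x2 with x1 held constant.
    x2, *_ = np.linalg.lstsq(A2, b - A1 @ x1, rcond=None)
    if k % 20 == 0:
        print(f"iter {k:3d}  f = {objective(x1, x2):.6e}")
```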

Similar Articles

Alternating Proximal Gradient Method for Convex Minimization

In this paper, we propose an alternating proximal gradient method that solves convex minimization problems with three or more separable blocks in the objective function. Our method is based on the framework of the alternating direction method of multipliers. The main computational effort in each iteration of the proposed method is to compute the proximal mappings of the involved convex functions. T...
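
The proximal mapping mentioned above is the key primitive of such methods. The sketch below, an illustration on assumed toy data rather than the paper's multi-block method, shows the closed-form prox of the l1 norm (soft-thresholding) and a single-block proximal gradient iteration on a lasso-type objective.

```python
# Proximal mapping illustration: prox of the l1 norm and a proximal gradient step
# on 0.5 * ||Ax - b||^2 + lam * ||x||_1 (toy data, hypothetical parameters).
import numpy as np

def prox_l1(v, t):
    """prox_{t*||.||_1}(v) = sign(v) * max(|v| - t, 0) applied elementwise."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the smooth part

x = np.zeros(10)
for _ in range(200):
    grad = A.T @ (A @ x - b)             # gradient of the smooth term
    x = prox_l1(x - step * grad, step * lam)   # forward-backward (proximal) step
print("nonzero entries:", np.count_nonzero(np.abs(x) > 1e-8))
```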

Inexact Alternating Direction Methods of Multipliers for Separable Convex Optimization

Abstract. Inexact alternating direction multiplier methods (ADMMs) are developed for solving general separable convex optimization problems with a linear constraint and with an objective that is the sum of smooth and nonsmooth terms. The approach involves linearized subproblems, a back substitution step, and either gradient or accelerated gradient techniques. Global convergence is established. ...
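
For reference, here is a minimal sketch of exact, scaled-form ADMM iterations on a toy lasso-type problem. It illustrates the subproblem structure that the inexact and linearized variants described above relax; the data and the parameters rho and lam are hypothetical.

```python
# Exact scaled-form ADMM sketch for: minimize 0.5*||Ax - b||^2 + lam*||z||_1
# subject to x - z = 0 (toy instance, not the cited paper's inexact scheme).
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 15))
b = rng.standard_normal(40)
lam, rho = 0.1, 1.0

n = A.shape[1]
x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)
M = np.linalg.inv(A.T @ A + rho * np.eye(n))   # factor once, reuse every iteration

for _ in range(100):
    x = M @ (A.T @ b + rho * (z - u))                                  # x-subproblem (quadratic)
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)    # z-subproblem (l1 prox)
    u = u + x - z                                                      # scaled dual update
print("primal residual:", np.linalg.norm(x - z))
```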

Subgradient methods for convex minimization

Many optimization problems arising in various applications require minimization of an objective cost function that is convex but not differentiable. Such a minimization arises, for example, in model construction, system identification, neural networks, pattern classification, and various assignment, scheduling, and allocation problems. To solve convex but not differentiable problems, we have to emp...
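
As a concrete illustration of such methods, the following is a small sketch of the plain subgradient method with a diminishing step size on a toy nondifferentiable objective ||Ax - b||_1. It is a generic example under assumed data, not tied to any specific scheme from the cited work.

```python
# Subgradient method sketch for the nondifferentiable convex problem
# minimize f(x) = ||Ax - b||_1; a valid subgradient is A^T sign(Ax - b).
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((25, 8))
b = rng.standard_normal(25)

x = np.zeros(8)
best = np.inf
for k in range(2000):
    g = A.T @ np.sign(A @ x - b)            # subgradient of f at x
    x = x - (0.1 / np.sqrt(k + 1)) * g      # diminishing step size
    best = min(best, np.linalg.norm(A @ x - b, 1))
print("best objective value seen:", best)
```

Because the method is not a descent method, the best value seen so far is tracked rather than the last iterate.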

An Extragradient-Based Alternating Direction Method for Convex Minimization

In this paper, we consider the problem of minimizing the sum of two convex functions subject to linear linking constraints. The classical alternating direction type methods usually assume that the two convex functions have relatively easy proximal mappings. However, many problems arising from statistics, image processing and other fields have the structure that only one of the two functions has...
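
The extragradient idea named in the title is sketched below on a toy bilinear saddle-point problem: a predictor step followed by a corrector step that re-evaluates the gradient at the predicted point. This is only an illustration of the building block with hypothetical data, not the paper's alternating direction scheme.

```python
# Extragradient sketch on min_x max_y x^T M y (saddle point at x = y = 0).
# An orthogonal M is used so the contraction is easy to observe; plain
# gradient descent-ascent does not converge on this problem.
import numpy as np

rng = np.random.default_rng(4)
M, _ = np.linalg.qr(rng.standard_normal((6, 6)))   # orthogonal coupling matrix, ||M|| = 1
tau = 0.5                                          # step size below 1/||M||

x = rng.standard_normal(6)
y = rng.standard_normal(6)
for _ in range(200):
    # Predictor (extrapolation) step from the current point.
    x_half = x - tau * (M @ y)
    y_half = y + tau * (M.T @ x)
    # Corrector step uses the gradients evaluated at the predicted point.
    x = x - tau * (M @ y_half)
    y = y + tau * (M.T @ x_half)
print("distance to the saddle point:", np.linalg.norm(np.concatenate([x, y])))
```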

Convex Optimization for Parallel Energy Minimization

Energy minimization has been an intensely studied core problem in computer vision. With growing image sizes (2D and 3D), it is now highly desirable to run energy minimization algorithms in parallel. But many existing algorithms, in particular, some efficient combinatorial algorithms, are difficult to parallelize. By exploiting results from convex and submodular theory, we reformulate the quadra...

Journal

Journal title: Journal of Inverse and Ill-posed Problems

Year: 2021

ISSN: 0928-0219, 1569-3945

DOI: https://doi.org/10.1515/jiip-2020-0074